936 research outputs found

    Diagnostic ultrasound should be performed without upper intensity limits

    Full text link
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/135131/1/mp5500.pd

    An All Optical Fibre Quantum Controlled-NOT Gate

    Full text link
    We report the first experimental demonstration of an optical controlled-NOT gate constructed entirely in fibre. We operate the gate using two heralded optical fibre single-photon sources and find an average logical fidelity of 90% and an average process fidelity of 0.83&lt;F&lt;0.91. On the basis of a simple model we are able to conclude that imperfections are primarily due to the photon sources, meaning that the gate itself works with very high fidelity. Comment: 4 pages, 4 figures, comments welcome
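The reported average logical fidelity is the mean probability, over the four computational-basis inputs, of measuring the ideal CNOT output. A minimal sketch with a hypothetical measured truth table (the probabilities below are invented for illustration, not the paper's data):

```python
# Hypothetical measured truth table: rows are input basis states,
# columns are measured output probabilities for each outcome.
# An ideal CNOT maps 00->00, 01->01, 10->11, 11->10.
measured = {
    "00": {"00": 0.92, "01": 0.03, "10": 0.03, "11": 0.02},
    "01": {"00": 0.04, "01": 0.90, "10": 0.02, "11": 0.04},
    "10": {"00": 0.02, "01": 0.04, "10": 0.05, "11": 0.89},
    "11": {"00": 0.03, "01": 0.03, "10": 0.89, "11": 0.05},
}

IDEAL = {"00": "00", "01": "01", "10": "11", "11": "10"}

def average_logical_fidelity(table):
    """Mean probability of obtaining the ideal CNOT output state."""
    return sum(table[i][IDEAL[i]] for i in IDEAL) / len(IDEAL)

fidelity = average_logical_fidelity(measured)
```

With this made-up table the average works out to 0.90, matching the figure quoted in the abstract only by construction.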

    Assessing gaps and needs for integrating building performance optimization tools in net zero energy buildings design

    Get PDF
    This paper summarizes a study undertaken to reveal potential challenges and opportunities for integrating optimization tools in net zero energy buildings (NZEB) design. The paper reviews current trends in simulation-based building performance optimization (BPO) and outlines major criteria for optimization tool selection and evaluation. This is based on analyzing users' needs for tool capabilities and requirement specifications. The review is carried out by means of a literature review of 165 publications and interviews with 28 optimization experts. The findings are based on an inter-group comparison between experts. The aim is to assess the gaps and needs for integrating BPO tools in NZEB design. The findings indicate a breakthrough in using evolutionary algorithms to solve highly constrained envelope, HVAC and renewable optimization problems. A simple genetic algorithm solved many design and operation problems and allowed measuring the improvement in the optimality of a solution against a base case. Evolutionary algorithms are also easily adapted to solve a particular optimization problem more effectively. However, existing limitations include model uncertainty, computation time, difficulty of use and a steep learning curve. Some future directions anticipated or needed for the improvement of current tools are presented. Peer reviewed
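The simple genetic algorithm the review highlights can be sketched as follows. The objective function below is a hypothetical stand-in for a building-performance simulation (the variables, bounds and target values are assumptions for illustration, not from any cited tool):

```python
import random

random.seed(0)

# Toy objective: hypothetical annual energy use as a function of two
# envelope variables, insulation thickness x0 (m) and window-to-wall
# ratio x1 (fraction). A real BPO run would call a simulation engine here.
def energy_use(x):
    insulation, wwr = x
    return (0.3 - insulation) ** 2 + (wwr - 0.4) ** 2

def simple_ga(pop_size=30, generations=40, mutation=0.05):
    # Random initial population within assumed design bounds
    pop = [[random.uniform(0, 0.5), random.uniform(0.1, 0.9)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy_use)
        parents = pop[: pop_size // 2]           # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(a[i] + b[i]) / 2 + random.gauss(0, mutation)
                     for i in range(2)]          # blend crossover + Gaussian mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=energy_use)

best = simple_ga()
```

Measuring improvement against a base case, as the abstract describes, amounts to comparing `energy_use(base_design)` with `energy_use(best)`.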

    Envelopment methodology to measure and compare subcontractor productivity at the firm level

    Full text link
    This paper describes a conceptual approach to measure and compare productivity of resource utilization at the firm level, adapting a set of techniques known as Data Envelopment Analysis (DEA). Within this approach, the paper addresses the issues of multiple inputs and multiple outputs of a construction firm, level of detail for data collection, and the required transformations to correct for differences among projects. In particular, we focus on the resource management of subcontractors. Subcontractors manage multiple, concurrent projects and must allocate limited resources across these projects. Interaction between projects and resource allocation creates non-linear effects, and therefore the productivity of the firm is not simply the productivity of its projects. The proposed measurement methodology will allow assessment of the impact of different management policies (including many of those proposed by lean construction researchers) on firm performance. It is hoped that this novel approach to productivity measurement will help subcontractors identify efficient practices and superior management policies, and will promote adoption of these policies.
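Full DEA solves one linear program per firm over multiple inputs and outputs; as a hedged illustration only, the degenerate single-input, single-output case reduces to scaling each firm's output/input ratio by the best ratio in the sample. The firm names and figures below are invented:

```python
# Hypothetical firm-level data: one aggregate input (labour hours) and one
# aggregate output (installed quantity). Real DEA handles multiple inputs
# and outputs via linear programming; with a single input and output the
# CCR efficiency score is each firm's output/input ratio divided by the
# best ratio observed among all firms.
firms = {
    "Sub-A": {"hours": 1200, "output": 900},
    "Sub-B": {"hours": 1500, "output": 1400},
    "Sub-C": {"hours": 800,  "output": 500},
}

def ccr_efficiency(data):
    """Efficiency in (0, 1]; the frontier firm(s) score exactly 1."""
    ratios = {k: v["output"] / v["hours"] for k, v in data.items()}
    best = max(ratios.values())
    return {k: r / best for k, r in ratios.items()}

scores = ccr_efficiency(firms)
```

In this toy sample Sub-B defines the efficient frontier; the project-level corrections the paper proposes would adjust the inputs and outputs before this comparison.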

    Estimating a Path through a Map of Decision Making

    Get PDF
    Studies of the evolution of collective behavior consider the payoffs of individual versus social learning. We have previously proposed that the relative magnitude of social versus individual learning could be compared against the transparency of payoffs, i.e. the "transparency" of the decision, through a heuristic, two-dimensional map. Moving from west to east, the estimated strength of social influence increases. As the decision maker proceeds from south to north, transparency of choice increases, and it becomes easier to identify the best choice itself and/or the best social role model from whom to learn (depending on position on the east–west axis). Here we show how to parameterize the functions that underlie the map, how to estimate these functions, and thus how to describe estimated paths through the map. We develop estimation methods on artificial data sets and discuss real-world applications such as modeling changes in health decisions.

    Validated Sandwich ELISA for the Quantification of von Willebrand Factor in Rabbit Plasma

    Get PDF
    von Willebrand Factor (vWF) is a multimeric plasma protein important for platelet plug formation. As part of its haemostatic role, it is released from endothelial cells during vascular stress or injury and is considered an excellent biomarker of endothelial function. Currently, there are no validated kits available to measure vWF in rabbits. We developed a sensitive and reproducible sandwich enzyme-linked immunosorbent assay (ELISA) for detection of vWF in rabbit plasma using commercially available antibodies and reagents. Purified human vWF was used as a calibrator standard with a dynamic range of 1.56–100 ng/mL. The Minimum Required Dilution for rabbit plasma was 1:100. When plasma was spiked with 3.76 or 10 ng/mL vWF, recovery was 108 ± 2% and 93 ± 2%, respectively. Intra- and inter-assay precision for 8 rabbit plasma samples were 3% and 4%, respectively. The Minimum Detectable Concentration was 254 pg/mL for purified human vWF and 1:10,700 dilution of cholesterol-fed rabbit plasma, and the Reliable Detection Limits were 457 pg/mL and 1:5940. Three freeze-thaw cycles significantly decreased vWF concentrations for purified human vWF and 2 of 3 plasma samples assayed. This ELISA provides sensitive and reproducible measurements of rabbit plasma vWF, which is an important biomarker for cardiovascular research.
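Quantification in a sandwich ELISA typically back-calculates concentration from optical density via a fitted four-parameter logistic (4PL) calibration curve. The sketch below uses illustrative parameter values, not the assay's fitted ones, and applies the 1:100 Minimum Required Dilution quoted above:

```python
# Hypothetical 4PL calibration parameters (not the paper's fitted values):
# a = response at zero concentration, d = response at saturation,
# c = EC50 in ng/mL, b = slope factor.
a, b, c, d = 0.05, 1.2, 25.0, 2.5

def od_from_conc(x):
    """4PL forward curve: optical density for concentration x (ng/mL)."""
    return d + (a - d) / (1 + (x / c) ** b)

def conc_from_od(y):
    """Invert the 4PL to back-calculate concentration from an OD reading."""
    return c * ((a - d) / (y - d) - 1) ** (1 / b)

od = od_from_conc(10.0)      # simulate a standard read at 10 ng/mL
back = conc_from_od(od)      # back-calculate from the curve
plasma_conc = back * 100     # correct for the 1:100 Minimum Required Dilution
```

The round trip recovers the input concentration exactly on a noiseless curve; in practice the fitted parameters and read noise determine the detectable and reliable limits the abstract reports.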

    Asteroids Were Born Big

    Get PDF
    How big were the first planetesimals? We attempt to answer this question by conducting coagulation simulations in which planetesimals grow by mutual collisions and form larger bodies and planetary embryos. The size frequency distribution (SFD) of the initial planetesimals is considered a free parameter in these simulations, and we search for the one that produces, at the end, objects with an SFD consistent with asteroid belt constraints. We find that, if the initial planetesimals were small (e.g. km-sized), the final SFD fails to fulfill these constraints. In particular, reproducing the bump observed at diameter D ~ 100 km in the current SFD of the asteroids requires that the minimal size of the initial planetesimals was also ~100 km. This supports the idea that planetesimals formed big, namely that the size of solids in the proto-planetary disk "jumped" from sub-meter scale to multi-kilometer scale without passing through intermediate values. Moreover, we find evidence that the initial planetesimals had to have sizes ranging from 100 km to several hundred km, probably even 1,000 km, and that their SFD over this interval had to have a slope similar to the one characterizing the current asteroids in the same size range. This result sets a new constraint on planetesimal formation models and opens new perspectives for the investigation of collisional evolution in the asteroid and Kuiper belts, as well as of the accretion of the cores of the giant planets. Comment: Icarus (2009), in press
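As a rough illustration of the kind of initial condition such coagulation simulations scan over, one can draw planetesimal diameters from a truncated power-law SFD, N(>D) ∝ D^-q, between 100 and 1,000 km. The slope q = 2.5 here is a placeholder, not the paper's fitted value:

```python
import random

random.seed(1)

# Sketch: inverse-CDF sampling of diameters D (km) from a cumulative
# power law N(>D) ~ D^-q truncated to [dmin, dmax]. This only generates
# a candidate initial SFD; the coagulation simulation itself then
# evolves it by mutual collisions.
def sample_sfd(n, dmin=100.0, dmax=1000.0, q=2.5):
    out = []
    for _ in range(n):
        u = random.random()
        d = (dmin ** -q + u * (dmax ** -q - dmin ** -q)) ** (-1 / q)
        out.append(d)
    return out

sizes = sample_sfd(10000)
```

With a steep slope most bodies cluster near the 100 km lower cutoff, which is the regime the paper argues the real initial population occupied.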

    Measuring information dependency for construction engineering projects

    Get PDF
    Information dependency may be the most important key to managing information exchange to reduce project risks. Studies to date have not identified an objective, quantitative surrogate for measuring information dependency. This paper suggests an approach to measuring information dependency through the productivity relationships among the various disciplines on heavy industrial engineering projects. As part of a Construction Industry Institute (CII) study, the authors identified the information exchange patterns of engineering disciplines. Based on these patterns, the authors inferred the information dependency among disciplines from their productivity relationships and subsequently conducted a survey for validation. Both results show significant and consistent evidence that: 1) information of the equipment and piping disciplines is statistically dependent, unlike that of the other paired disciplines; and 2) the productivity relationship can be a legitimate surrogate for measuring information dependency between the equipment and piping disciplines. As such, this study opens a research trajectory for improving engineering productivity.
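The surrogate idea, that a strong productivity relationship between two disciplines signals information dependency, can be sketched with a simple correlation. The productivity indices below are hypothetical, not the study's data:

```python
# Hypothetical productivity indices for two engineering disciplines across
# the same six projects. A strong correlation between the paired series is
# the kind of productivity relationship the study uses as a surrogate for
# information dependency.
equipment = [0.82, 0.91, 0.75, 0.88, 0.95, 0.70]
piping    = [0.80, 0.93, 0.72, 0.85, 0.97, 0.69]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(equipment, piping)
```

A high r between one pair of disciplines and low r between others would mirror the equipment–piping result reported above; the actual study's statistical test is not specified in the abstract.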